This Startup Wants to Put Its Brain-Computer Interface in the Apple Vision Pro
California-based Cognixion announced today that it is launching a clinical trial of its wearable brain-computer interface, integrated with a modified Apple Vision Pro headset, to help paralyzed people with speech disorders communicate with their thoughts, without an invasive brain implant. Cognixion is one of several companies, including Elon Musk's Neuralink, developing a brain-computer interface, or BCI: a system that captures brain signals and translates them into commands to control external devices. While Neuralink and others are working on implants that are surgically placed in the head, Cognixion's technology is noninvasive.
Mind-Controlled Robots A Step Closer To Realization
Survivors of a severe brain or spinal cord injury are often left with a lifelong disability. A common consequence is permanent paralysis caused by damage to the nervous system. The most severe form is tetraplegia, in which a person loses control of both arms and both legs. Researchers have been working for years to build devices that tetraplegic patients can control with their thoughts, allowing them to perform certain activities independently. Several institutions and organizations are working on seamless mind-controlled robots that can carry out a variety of tasks.
This AI tool helps people with speech difficulties to communicate
A new AI tool can help people with speech difficulties communicate by reducing the number of keystrokes they need to type. Researchers from the universities of Cambridge and Dundee developed the system for people with motor disabilities, who often use computers with speech output to communicate. Unfortunately, these tools are generally slow and error-prone: users typically type between five and 20 words per minute, while people speak between 100 and 140 words per minute. As a result, people who rely on computers to communicate can struggle to hold meaningful conversations. The new AI tool helps close this communication gap by cutting the number of keystrokes required.
Context-aware sentence retrieval method reduces 'communication gap' for nonverbal people
Researchers have used artificial intelligence to reduce the 'communication gap' for nonverbal people with motor disabilities who rely on computers to converse with others. The team, from the University of Cambridge and the University of Dundee, developed a new context-aware method that reduces this communication gap by eliminating between 50% and 96% of the keystrokes the person has to type to communicate. "This method gives us hope for more innovative AI-infused systems to help people with motor disabilities to communicate in the future," said Per Ola Kristensson. The system is specifically tailored for nonverbal people and uses a range of context 'clues' – such as the user's location, the time of day, or the identity of the user's speaking partner – to suggest the sentences that are most relevant for the user. Nonverbal people with motor disabilities often use a computer with speech output to communicate with others. However, even without a physical disability that affects the typing process, these communication aids are too slow and error-prone for meaningful conversation: typical typing rates are between five and 20 words per minute, while a typical speaking rate is in the range of 100 to 140 words per minute.
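To make the idea concrete, here is a minimal sketch of context-aware sentence retrieval. The researchers' actual model is not described in this excerpt, so the specific context features (location, time of day, speaking partner), the scoring weights, and all names below are illustrative assumptions: the sketch simply ranks stored sentences by how well they match a typed prefix and the current context, which is the general shape of the technique.

```python
from dataclasses import dataclass

# Illustrative only: features and weights are assumptions, not the
# published Cambridge/Dundee method.

@dataclass
class StoredSentence:
    text: str
    location: str      # e.g. "home", "cafe"
    time_of_day: str   # e.g. "morning", "evening"
    partner: str       # who the user is talking to

def score(c: StoredSentence, typed: str, location: str,
          time_of_day: str, partner: str) -> float:
    """Score one candidate: prefix match plus context overlap."""
    s = 0.0
    if typed and c.text.lower().startswith(typed.lower()):
        s += 2.0                                  # typed prefix: strongest signal
    s += 1.0 if c.location == location else 0.0   # where the user is
    s += 0.5 if c.time_of_day == time_of_day else 0.0
    s += 1.0 if c.partner == partner else 0.0     # who they are speaking with
    return s

def suggest(corpus, typed, location, time_of_day, partner, k=3):
    """Return the k most relevant stored sentences for this context."""
    ranked = sorted(
        corpus,
        key=lambda c: score(c, typed, location, time_of_day, partner),
        reverse=True,
    )
    return [c.text for c in ranked[:k]]

corpus = [
    StoredSentence("Good morning, how did you sleep?", "home", "morning", "caregiver"),
    StoredSentence("Could I have a coffee, please?", "cafe", "morning", "barista"),
    StoredSentence("See you tomorrow.", "clinic", "evening", "doctor"),
]
# Four typed characters retrieve a full 32-character sentence, which is
# the kind of keystroke saving the 50-96% figure refers to.
print(suggest(corpus, "Good", "home", "morning", "caregiver", k=1))
```

Because the highest-scoring sentence is spoken in full after only a few keystrokes, the fraction of keystrokes eliminated grows with the length of the retrieved sentence, which is how such systems can approach the reported savings.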
Robotics Legend Ayanna Howard On The Future Of Human-Robot Interactions
When roboticist Ayanna Howard was a little girl, she was inspired by TV to pursue a career in science. Growing up in the 1970s, she was particularly captivated by the TV show The Bionic Woman. "I wanted to be the bionic woman," she said. "The rest of my life has been about figuring out what that means." Today, the focus of her work is the way humans and robots work together to augment each other's capabilities.